Differential private average publishing of numerical stream data for wearable devices
TU Zixuan, LIU Shubo, XIONG Xingxing, ZHAO Jing, CAI Zhaohui
Journal of Computer Applications    2020, 40 (6): 1692-1697.   DOI: 10.11772/j.issn.1001-9081.2019111929
User health data such as heart rate and blood glucose, generated in real time by wearable devices, is of great significance for health monitoring and disease diagnosis. However, health data is private information of users. In order to publish the average value of numerical stream data from wearable devices while preventing the leakage of users' private information, a new differentially private average publishing method for wearable devices based on adaptive sampling was proposed. Firstly, a global sensitivity adapted to the characteristically small fluctuation of stream-data averages of wearable devices was introduced. Then, the privacy budget was allocated by adaptive sampling based on Kalman filter error adjustment, so as to improve the availability of the published data. In experiments on publishing two kinds of health data, when the privacy budget is 0.1, i.e., under a high level of privacy protection, the Mean Relative Errors (MRE) of the proposed method on the heart rate dataset and the blood glucose dataset are only 0.01 and 0.08, which are 36% and 33% lower than those of the Filtering and Adaptive Sampling for differentially private Time-series monitoring (FAST) algorithm. The proposed method can improve the usability of stream data publishing for wearable devices.
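The interplay of Laplace perturbation, sampling, and Kalman filtering described above can be sketched in a few lines of Python. This is an illustrative simplification, not the paper's algorithm: sampling happens at a fixed interval rather than adaptively from the filter error, and all names (`publish_stream`, `laplace_noise`) and constants are assumptions.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def publish_stream(averages, epsilon, sensitivity, sample_every=3):
    """Differentially private publishing of a stream of averages.

    Only sampled steps spend privacy budget; the other steps republish
    the current Kalman-filtered estimate, which smooths the noise.
    """
    published = []
    estimate, var = 0.0, 1.0
    q = 1e-4                                # assumed process noise
    r = 2.0 * (sensitivity / epsilon) ** 2  # variance of the Laplace noise
    for t, avg in enumerate(averages):
        var += q                            # Kalman predict step
        if t % sample_every == 0:           # spend budget on this step
            noisy = avg + laplace_noise(sensitivity / epsilon)
            gain = var / (var + r)          # Kalman correct step
            estimate += gain * (noisy - estimate)
            var *= 1.0 - gain
        published.append(estimate)
    return published
```

Spending budget only at sampled steps is what lets the per-release noise scale stay small for a fixed total budget.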
Unified algorithm for scattered point cloud denoising and simplification
ZHAO Jingdong, YANG Fenghua, GUO Yingxin
Journal of Computer Applications    2017, 37 (10): 2879-2883.   DOI: 10.11772/j.issn.1001-9081.2017.10.2879
Since it is difficult to both denoise and simplify three-dimensional point cloud data with a single parameter, a new unified algorithm for denoising and simplification of scattered point clouds based on the Extended Surface Variation based Local Outlier Factor (ESVLOF) was proposed. Through analysis of the definition of ESVLOF, its properties were given. Using the surface variation computed during denoising and a default similarity coefficient, a parameter γ that decreases as surface variation increases was constructed and used as the local threshold for denoising and simplifying the point cloud. The simulation results show that this method preserves the geometric characteristics of the original data. Compared with traditional 3D point cloud preprocessing, the efficiency of this method is nearly doubled.
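The role of a variation-dependent local threshold can be sketched as follows. The abstract does not give the paper's exact formula for γ, so the linear mapping below, and the names `local_thresholds` and `simplify`, are assumptions for illustration only.

```python
def local_thresholds(surface_variation, similarity=0.9):
    """Map each point's surface variation to a local threshold gamma that
    decreases as variation grows, so feature-rich regions keep more points.
    (Illustrative linear form; the paper derives gamma from the variation
    computed during denoising and a default similarity coefficient.)"""
    s_max = max(surface_variation) or 1.0
    return [similarity * (1.0 - s / s_max) for s in surface_variation]

def simplify(points, gammas):
    """Greedy simplification: keep a point only if it is farther than its
    local threshold gamma from every point already kept."""
    kept = []
    for p, g in zip(points, gammas):
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 + (p[2] - q[2]) ** 2 > g * g
               for q in kept):
            kept.append(p)
    return kept
```

With a flat region (zero variation everywhere) the threshold is large and the cloud is thinned aggressively; near sharp features γ shrinks toward zero and points survive.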
Fast reconstruction algorithm for photoacoustic computed tomography in vivo
JIANG Zibo, ZHAO Jingxiu, ZHANG Yuanke, MENG Jing
Journal of Computer Applications    2016, 36 (3): 811-814.   DOI: 10.11772/j.issn.1001-9081.2016.03.811
Focusing on the issue that Photoacoustic Computed Tomography (PACT) based on an ultrasonic array generally requires a huge amount of acquired data and a time-consuming imaging process, a fast photoacoustic computed tomography method using Principal Component Analysis (PCA) was proposed to extend PACT to the field of hemodynamics. First, a matrix of image samples was constructed from part of the full-sampling data. Second, a projection matrix representing the signal features was derived by decomposing the sample matrix. Finally, high-quality three-dimensional photoacoustic images could be recovered with this projection matrix under three-fold under-sampling. The experimental results on in vivo imaging of rat back vasculature show that, compared with the traditional back-projection method, the data acquisition amount of PCA-based PACT is decreased by about 35%, and the three-dimensional reconstruction speed is improved by about 40%. As a result, both fast data acquisition and highly accurate image reconstruction are achieved.
Image denoising algorithm based on sparse representation and nonlocal similarity
ZHAO Jingkun, ZHOU Yingyue, LIN Maosong
Journal of Computer Applications    2016, 36 (2): 551-555.   DOI: 10.11772/j.issn.1001-9081.2016.02.0551
For the problem of denoising images corrupted by mixed noise such as Additive White Gaussian Noise (AWGN) combined with Salt-and-Pepper Impulse Noise (SPIN) and Random-Valued Impulse Noise (RVIN), an improved image restoration algorithm based on the existing weighted encoding method was proposed, integrating image priors of sparse representation and non-local similarity. Firstly, dictionary-based sparse representation was used to build a variational denoising model, and a weighting factor was designed for the data fidelity term to suppress impulse noise. Secondly, the non-local means method was used to obtain an initial denoised image, from which a mask matrix was built to exclude impulse noise points and thus obtain a good non-local similarity prior. Finally, the image sparsity prior and the non-local similarity prior were integrated into the regularization term of the variational model, and the final denoised image was obtained by solving the model. The experimental results show that under different noise ratios, the Peak Signal-to-Noise Ratio (PSNR) of the proposed algorithm is 1.7 dB higher than that of the fuzzy weighted non-local means filter, and its Feature Similarity Index (FSIM) is 0.06 higher; compared with the weighted encoding method, the PSNR is 0.64 dB higher and the FSIM 0.03 higher. The proposed method has better recovery performance, especially for images with strong texture, and retains the real information of the image.
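The mask-matrix step above marks likely impulse pixels so they are excluded from the non-local similarity prior. As a stand-in for the paper's non-local-means-based initialization, the sketch below flags pixels that deviate strongly from their local median; the function name and threshold are assumptions.

```python
def impulse_mask(img, thresh=50):
    """Flag pixels that deviate strongly from their 3x3 neighborhood median
    as impulse-noise candidates (1 = clean, 0 = noisy). Border pixels are
    left marked clean for simplicity."""
    h, w = len(img), len(img[0])
    mask = [[1] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            nb = sorted(img[i + di][j + dj]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1))
            if abs(img[i][j] - nb[4]) > thresh:  # nb[4] is the median of 9
                mask[i][j] = 0
    return mask
```

A SPIN-corrupted pixel (0 or 255 amid smooth content) sits far from its neighborhood median, so it is masked out and only the remaining pixels feed the similarity prior.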
K-nearest neighbor searching algorithm for laser scattered point cloud
ZHAO Jingdong, YANG Fenghua
Journal of Computer Applications    2016, 36 (10): 2863-2869.   DOI: 10.11772/j.issn.1001-9081.2016.10.2863
Aiming at the large data volume and the surface characteristics of laser scattered point clouds, a K-Nearest Neighbors (KNN) searching algorithm for laser scattered point clouds was put forward to reduce memory usage and improve processing efficiency. Firstly, only the numbers of non-empty subspaces were stored, using multistage classification and dynamic linked-list storage. Adjacent subspaces were coded in ternary, the pointer connections between adjacent subspaces were established through the dual relationship of the codes, and a generalized table containing all the information required for KNN searching was constructed; then the KNN were searched. During KNN searching, when calculating the distance from the measured point to the candidate points, candidate points outside the inscribed sphere of the filtration cube were deleted directly, reducing by half the candidate points participating in the sort by distance. Both dividing principles, whether or not they depend on the value of K, can be used to compute different K neighborhoods. Experimental results prove that the proposed algorithm not only has low memory usage but also high efficiency.
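The essence of subdivision-based KNN search can be sketched with a hash grid that stores only non-empty cells and scans outward ring by ring, stopping once k candidates lie inside the radius the scanned cells are guaranteed to cover. This is a simplification of the paper's ternary-coded generalized table; all names are illustrative.

```python
import math
from collections import defaultdict

def grid_knn(points, query, k, cell=1.0):
    """Grid-accelerated KNN: bucket 3D points into cubic cells (only
    non-empty cells exist in the dict), then scan cells ring by ring in
    Chebyshev distance. After scanning rings 0..r, every unscanned point
    is at least r*cell away, so k hits within that radius are final."""
    grid = defaultdict(list)
    for p in points:
        grid[tuple(int(math.floor(c / cell)) for c in p)].append(p)
    q = tuple(int(math.floor(c / cell)) for c in query)
    d2 = lambda p: sum((a - b) ** 2 for a, b in zip(p, query))
    found, ring = [], 0
    while True:
        for key, pts in grid.items():
            if max(abs(key[i] - q[i]) for i in range(3)) == ring:
                found.extend(pts)
        if len(found) >= k and ring >= 1:
            found.sort(key=d2)
            within = [p for p in found if d2(p) <= (ring * cell) ** 2]
            if len(within) >= k:
                return within[:k]
        ring += 1
        if ring > 1000:            # safety guard for very sparse data
            found.sort(key=d2)
            return found[:k]
```

Because only non-empty cells are stored, memory scales with occupied space, echoing the paper's motivation.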
Polynomial interpolation algorithm framework based on osculating polynomial approximation
ZHAO Xiaole, WU Yadong, ZHANG Hongying, ZHAO Jing
Journal of Computer Applications    2015, 35 (8): 2266-2273.   DOI: 10.11772/j.issn.1001-9081.2015.08.2266

Polynomial interpolation is a common approximation method in approximation theory, widely used in numerical analysis, signal processing, and so on. Traditional polynomial interpolation algorithms are mainly developed by combining numerical analysis with experimental results, lacking a unified theoretical description and a systematic solution. A unified theoretical framework for polynomial interpolation algorithms based on osculating polynomial approximation theory was proposed. Existing interpolation algorithms can be analyzed and new algorithms can be developed under this framework, which consists of the number of sample points, the osculating order at the sample points, and the derivative approximation rules. The representation of existing mainstream interpolation algorithms in the proposed framework was analyzed, and the general process for developing new algorithms was shown using a four-point, second-order osculating polynomial interpolation. Theoretical analysis and numerical experiments show that almost all mainstream polynomial interpolation algorithms belong to osculating polynomial interpolation, and their effects are strongly related to the number of sample points, the osculating order, and the derivative approximation rules.
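A familiar instance of this framework is Catmull-Rom interpolation: four sample points, osculation of value and first derivative at the two inner points, and central differences as the derivative approximation rule. The sketch below is a standard formulation, not the paper's specific four-point, second-order construction.

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Four-point interpolation in the osculating framework: derivatives at
    the inner samples are approximated by central differences, and a cubic
    Hermite polynomial osculates value and first derivative at p1 and p2.
    t in [0, 1] parameterizes the interval between p1 and p2."""
    d1 = (p2 - p0) / 2.0   # derivative approximation rule at p1
    d2 = (p3 - p1) / 2.0   # derivative approximation rule at p2
    t2, t3 = t * t, t * t * t
    return ((2 * t3 - 3 * t2 + 1) * p1 + (t3 - 2 * t2 + t) * d1
            + (-2 * t3 + 3 * t2) * p2 + (t3 - t2) * d2)
```

Changing any of the three framework ingredients (point count, osculating order, derivative rule) yields a different member of the family, which is exactly the design space the paper formalizes.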

Near outlier detection of scattered point cloud
ZHAO Jingdong, YANG Fenghua, LIU Aijing
Journal of Computer Applications    2015, 35 (4): 1089-1092.   DOI: 10.11772/j.issn.1001-9081.2015.04.1089

Concerning that the original Surface Variation based Local Outlier Factor (SVLOF) cannot filter out outliers on the edges or corners of a three-dimensional solid, a new near-outlier detection algorithm for scattered point clouds was proposed. The algorithm first defines SVLOF on a k-neighborhood-like region, expanding the definition of SVLOF. The expanded SVLOF can filter outliers not only on smooth surfaces but also on the edges and corners of a three-dimensional solid, while still retaining a threshold-selection range as ample as that of the original SVLOF. The experimental results on both simulated and measured data show that the new algorithm can detect the near outliers of scattered point clouds effectively without obviously reducing efficiency.

Certificateless signcryption with online/offline technique
ZHAO Jingjing, ZHAO Xuexia, SHI Yuerong
Journal of Computer Applications    2014, 34 (9): 2659-2663.   DOI: 10.11772/j.issn.1001-9081.2014.09.2659

Signcryption is a cryptographic primitive that simultaneously combines the authentication of a signature with the confidentiality of encryption. With the online/offline technique, online/offline signcryption provides higher efficiency for the system. However, most existing signcryption schemes are implemented in the identity-based setting, where the key escrow problem exists. Exploiting the advantages of the certificateless cryptography system, which removes certificate management and avoids the key escrow problem, a secure online/offline certificateless signcryption scheme was proposed. The proposed scheme does not need to determine the recipient's information in the offline stage. Moreover, its security was proved in the Random Oracle Model (ROM).

Entropy-based evaluation method of weighted network invulnerability
ZHAO Jingxian
Journal of Computer Applications    2014, 34 (9): 2627-2629.   DOI: 10.11772/j.issn.1001-9081.2014.09.2627

In order to study the invulnerability of a weighted network after it is partially destroyed, a standardized stability entropy index for evaluating the invulnerability between nodes was proposed. It combines the stability of the network topology with the network flow, by calculating the contribution of non-overlapping paths between nodes to the flow and adopting the entropy concept, and takes a fully-connected network as the reference. On this basis, a model to evaluate the invulnerability of the whole network was given. The simulation results show that network invulnerability relates not only to the network topology and the sum of all edge weights, but also to the uniformity of the edge weights: the more uniform the critical edge weights are, the stronger the overall invulnerability is.
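The link between weight uniformity and entropy can be illustrated with a normalized Shannon entropy over edge weights; this is the standard normalized-entropy form, used here as an assumption about the flavor of the paper's index rather than its exact definition.

```python
import math

def weight_entropy(weights):
    """Normalized entropy of edge weights: 1.0 when the weights are
    perfectly uniform, lower as they concentrate on a few edges,
    mirroring the finding that more uniform critical edge weights
    give stronger invulnerability."""
    total = float(sum(weights))
    probs = [w / total for w in weights if w > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(probs)) if len(probs) > 1 else 0.0
```

Skewing the same total weight toward one edge lowers the index, matching the qualitative conclusion of the simulations.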

Reverse curvature-driven super-resolution algorithm based on Taylor formula
ZHAO Xiaole, WU Yadong, ZHANG Hongying, ZHAO Jing
Journal of Computer Applications    2014, 34 (12): 3570-3575.  

To solve the problem that traditional interpolation and model-based methods usually decrease the contrast and sharpness of images, a reverse curvature-driven Super-Resolution (SR) algorithm based on the Taylor formula was proposed. The algorithm uses the Taylor formula to estimate the trend of change of image intensity, and then details image edge features with the curvature of isophotes. Gradients are used as constraints to inhibit jagged edges and ringing effects. The experimental results show that the proposed algorithm has obvious advantages over conventional interpolation algorithms and model-based methods in clarity and information retention, and its results accord better with human visual perception. Because the reverse diffusion based on Taylor expansion is implemented directly rather than iteratively, the proposed algorithm is also more efficient than traditional iterative algorithms.

Incremental learning method of Bayesian classification combined with feedback information
XU Ming-ying, WEI Yong-qing, ZHAO Jing
Journal of Computer Applications    2011, 31 (09): 2530-2533.   DOI: 10.3724/SP.J.1087.2011.02530
Owing to insufficient training sets, the performance of the initial classifier is unsatisfactory, and it cannot track users' needs dynamically. Concerning this defect, an incremental learning method for a Bayesian classifier combined with feedback information was proposed. To effectively reduce the redundancy between features and improve the representativeness of the feedback feature subset, an improved feature selection method based on a Genetic Algorithm (GA) was used to choose the best features from the feedback sets to amend the classifier. The experimental results show that the algorithm improves classification significantly and has good overall stability.
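Incremental learning is natural for naive Bayes because its state is just counts, so feedback documents can be folded in after initial training. The sketch below shows that mechanism only; the paper's GA-based feature selection is omitted, and the class name and smoothing choice are assumptions.

```python
import math
from collections import defaultdict

class IncrementalNB:
    """Multinomial naive Bayes with count-based state, so feedback
    documents amend the classifier by the same update path as the
    initial training (Laplace smoothing in prediction)."""

    def __init__(self):
        self.class_counts = defaultdict(int)
        self.word_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def update(self, words, label):
        # One call per initial or feedback document: just add counts.
        self.class_counts[label] += 1
        for w in words:
            self.word_counts[label][w] += 1
            self.vocab.add(w)

    def predict(self, words):
        total = sum(self.class_counts.values())
        best, best_lp = None, float("-inf")
        for c, n in self.class_counts.items():
            lp = math.log(n / total)
            denom = sum(self.word_counts[c].values()) + len(self.vocab)
            for w in words:
                lp += math.log((self.word_counts[c][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

Each feedback batch shifts the counts, so the decision boundary tracks the user's evolving labels without retraining from scratch.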
Color image edge detection with region homogeneous measure
ZHENG Mei-zhu, ZHAO Jing-xiu
Journal of Computer Applications    2011, 31 (09): 2485-2488.   DOI: 10.3724/SP.J.1087.2011.02485
Concerning the difficulty of effectively distinguishing color similarity in RGB color space, color image processing and analysis were implemented in HSI color space. Firstly, the chromatic aberration components of hue, saturation, and intensity were calculated. Then, by introducing fuzzy entropy, a group of information measures based on fuzzy entropy was constructed to describe the natural characteristics of image edges quantitatively. Four component vectors were obtained from training image samples, a BP neural network was trained with eigenvectors drawn from these four component vectors, and finally the trained BP neural network was used directly for edge detection. Both the architecture and the training of the BP neural network are simple. Moreover, the proposed edge detector needs no threshold, unlike conventional edge detection, and has a strong capacity for retaining details.
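The fuzzy entropy underlying such edge measures has a standard form: it vanishes for memberships near 0 or 1 (homogeneous regions) and peaks at 0.5 (ambiguous, edge-like regions). The sketch below shows that standard definition, not the paper's specific measures built on it.

```python
import math

def fuzzy_entropy(memberships):
    """Average fuzzy entropy S(mu) = -mu*ln(mu) - (1-mu)*ln(1-mu) over
    a window of membership values: near 0 for homogeneous regions
    (mu close to 0 or 1), maximal at mu = 0.5, i.e. on ambiguous
    edge regions."""
    def s(mu):
        if mu <= 0.0 or mu >= 1.0:
            return 0.0
        return -mu * math.log(mu) - (1 - mu) * math.log(1 - mu)
    return sum(s(mu) for mu in memberships) / len(memberships)
```

Feeding such window statistics to a classifier is what lets the detector dispense with a hand-set threshold.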
Dynamic taint analysis based on virtual technology
CHEN Yan-ling, ZHAO Jing
Journal of Computer Applications    2011, 31 (09): 2367-2372.   DOI: 10.3724/SP.J.1087.2011.02367
The records produced by current taint analysis tools are not accurate. To solve this, dynamic taint analysis based on virtualization technology was studied and implemented. A virtualization-based dynamic taint analysis framework was designed, and two kinds of taint signature models, based on Hook technology and on Hash-traversal technology, were given for memory taint and hard disk taint respectively. A taint propagation strategy was put forward according to the instruction types classified by the instruction encoding formats of Intel and AMD, and a taint record strategy based on instruction filtering was given to solve the problem of redundant information records. The experimental results prove that the proposed method is effective and can be well applied to test case generation and vulnerability detection in fuzz testing.
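Instruction-type-driven taint propagation can be sketched on a tiny three-address IR: copies propagate taint, arithmetic taints the destination if any source is tainted, and overwriting with clean data clears it. This is an illustrative toy, not the paper's x86-level implementation; the IR and names are assumptions.

```python
def propagate_taint(instructions, tainted):
    """Forward taint propagation over a toy three-address IR.
    Each instruction is (op, dst, src...); the destination becomes
    tainted iff any source operand is currently tainted, and is
    cleared when overwritten entirely with clean data."""
    tainted = set(tainted)
    for ins in instructions:
        op, dst, *srcs = ins
        if any(s in tainted for s in srcs):
            tainted.add(dst)
        else:
            tainted.discard(dst)   # overwritten with clean data
    return tainted
```

Filtering which instructions get recorded (the paper's record strategy) would sit around this loop, logging only instructions that touch the tainted set.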
Infrared small target detection algorithm based on improved two-sliding-window
LIU Xing-miao, WANG Shi-cheng, ZHAO Jing, HU Bo
Journal of Computer Applications    2011, 31 (05): 1217-1220.   DOI: 10.3724/SP.J.1087.2011.01217
The temporal characteristics of infrared images and the different features of small targets, noise, and background were analyzed, and a new infrared small target detection algorithm combining the temporal and spatial domains was put forward. Because the background changes slowly, the Signal-to-Noise Ratio (SNR) of the small target was first enhanced by subtracting adjacent frames. Then the potential small targets were detected by applying the center distinguishing method, and the two-sliding-window algorithm was adopted to remove isolated noise. Finally, the similarity distinguishing method was used to eliminate edge disturbance and achieve the final detection of the small target. The experimental results indicate that the improved algorithm has better target detection and real-time performance.
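The frame-differencing plus center-versus-surround pipeline can be sketched as follows. It is a deliberately simplified stand-in for the paper's two-sliding-window scheme: the inner window is a single pixel, the outer window is a 5x5 ring, and the thresholds are illustrative assumptions.

```python
def detect_small_targets(prev, curr, diff_thresh=30, contrast=2.0):
    """Temporal differencing followed by a center-vs-surround test:
    a pixel is a candidate target if the inter-frame difference is
    large and the center value of the difference map exceeds the
    outer-ring mean by a contrast factor, which suppresses slowly
    varying background and isolated edge responses."""
    h, w = len(curr), len(curr[0])
    diff = [[abs(curr[i][j] - prev[i][j]) for j in range(w)] for i in range(h)]
    hits = []
    for i in range(2, h - 2):
        for j in range(2, w - 2):
            if diff[i][j] < diff_thresh:
                continue
            ring = [diff[i + di][j + dj]
                    for di in (-2, -1, 0, 1, 2) for dj in (-2, -1, 0, 1, 2)
                    if max(abs(di), abs(dj)) == 2]
            outer = sum(ring) / len(ring)
            if diff[i][j] > contrast * max(outer, 1e-6):
                hits.append((i, j))
    return hits
```

A compact moving target lights up the center but not the surrounding ring of the difference map; extended edges raise both and are rejected.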